Granite-4.0-Tiny-Preview is a fine-grained mixture-of-experts (MoE) instruction-tuned model with 7 billion parameters. It was fine-tuned from Granite-4.0-Tiny-Base-Preview and is suited to general instruction-following tasks.
Large Language Model
Transformers
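Since the model is distributed for use with the Hugging Face Transformers library, a minimal loading-and-generation sketch may help. The repository ID `ibm-granite/granite-4.0-tiny-preview` is an assumption; check the model card for the exact name. The `build_messages` helper and `generate` function are illustrative names, not part of any official API.

```python
def build_messages(user_prompt: str) -> list:
    """Wrap a user prompt in the chat-message format expected by
    tokenizer.apply_chat_template()."""
    return [{"role": "user", "content": user_prompt}]


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model and generate a completion. Downloads the weights on
    first call; requires the `transformers` and `torch` packages."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "ibm-granite/granite-4.0-tiny-preview"  # assumed repo name
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )
    inputs = tokenizer.apply_chat_template(
        build_messages(prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        outputs[0, inputs.shape[-1]:], skip_special_tokens=True
    )
```

Because the model is instruction-tuned, prompts should go through the tokenizer's chat template rather than being passed as raw text.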